# Multilingual Knowledge Distillation

## Yixin Distill Qwen 72B
License: Apache-2.0
Author: YiXin-AILab
Tags: Large Language Model, Safetensors, Supports Multiple Languages

A high-performance distilled model optimized for mathematics and general reasoning, distilled from Qwen2.5-72B and refined with reinforcement learning.
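
A minimal inference sketch with Hugging Face transformers. The repo ID is assumed from the author and model name shown on this page rather than confirmed, and 72B-scale weights require a multi-GPU machine.

```python
# Minimal sketch: run the distilled model with Hugging Face transformers.
# The repo ID is assumed from the author/model name above, not confirmed here.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "YiXin-AILab/YiXin-Distill-Qwen-72B"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # keep the checkpoint's native precision
    device_map="auto",   # shard 72B-scale weights across available GPUs
)

prompt = "Solve for x: 3x + 5 = 20."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```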
## Shlm Grc En
License: MIT
Author: kevinkrahn
Tags: Text Embedding, Transformers, Supports Multiple Languages

This model creates sentence embeddings for Ancient Greek and English texts in a shared vector space, based on an improved HLM architecture and trained through multilingual knowledge distillation.
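
As a rough illustration of the training recipe named in the description, here is a minimal multilingual knowledge distillation sketch in the style of Reimers & Gurevych (2020), using the classic sentence-transformers v2 training API. The teacher and student model IDs and the parallel-corpus file are illustrative assumptions, not the author's actual training setup.

```python
# Minimal multilingual knowledge distillation sketch: a frozen English teacher
# produces target embeddings, and a multilingual student is trained so that an
# English sentence AND its Ancient Greek translation both map close to the
# teacher's embedding of the English side.
# Model IDs and the corpus file below are illustrative assumptions.
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, losses
from sentence_transformers.datasets import ParallelSentencesDataset

teacher = SentenceTransformer("all-MiniLM-L6-v2")  # frozen English teacher (assumed choice)
student = SentenceTransformer("xlm-roberta-base")  # multilingual student to train (assumed choice)

# Each line of the file holds one parallel pair: "english_sentence\tgreek_sentence".
# The dataset labels every pair with the teacher's embedding of the English side.
data = ParallelSentencesDataset(student_model=student, teacher_model=teacher)
data.load_data("parallel-grc-en.tsv")  # hypothetical parallel corpus

loader = DataLoader(data, shuffle=True, batch_size=32)
loss = losses.MSELoss(model=student)  # pull student embeddings toward teacher targets

student.fit(train_objectives=[(loader, loss)], epochs=1, warmup_steps=100)
student.save("distilled-grc-en-student")
```

After training, `student.encode()` places Greek and English sentences in the teacher's vector space, which is the shared-space property the entry above describes.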